Numerical optimization for the calculus of variations by gradients on non-Hilbert Sobolev spaces using conjugate gradients and normalized differential equations of steepest descent

Author

  • Ivie Stein Jr.
Abstract

The purpose of this paper is to illustrate the application of numerical optimization methods to nonquadratic functionals defined on non-Hilbert Sobolev spaces. These methods use a gradient defined on a norm-reflexive and hence strictly convex normed linear space. This gradient is defined by Michael Golomb and Richard A. Tapia in [M. Golomb, R.A. Tapia, The metric gradient in normed linear spaces, Numer. Math. 20 (1972) 115–124]. It is also the same gradient described by Jean-Paul Penot in [J.P. Penot, On the convergence of descent algorithms, Comput. Optim. Appl. 23 (3) (2002) 279–284].

In this paper we restrict our attention to variational problems with zero boundary values. Nonzero boundary value problems can be converted to zero boundary value problems by an appropriate transformation of the dependent variables, under which the original functional changes. The connection to the calculus of variations is the following: the notion of a relative minimum for the Sobolev norm, with p positive and large and with only first derivatives and function values, is related to the classical weak relative minimum in the calculus of variations. The motivation for minimizing nonquadratic functionals on these non-Hilbert Sobolev spaces is twofold. First, a norm equivalent to this Sobolev norm approaches the norm used for weak relative minimums in the calculus of variations as p approaches infinity. Second, the Sobolev norm is both norm-reflexive and strictly convex, so that the gradient on a non-Hilbert Sobolev space consists of a singleton set; hence the gradient exists and is unique in this non-Hilbert normed linear space.

Two gradient minimization methods are presented here: conjugate gradient methods and an approach that uses differential equations of steepest descent. The Hilbert space conjugate gradient method of James Daniel in [J. Daniel, The Approximate Minimization of Functionals, Prentice-Hall, Englewood Cliffs, NJ, 1971] is extended to a conjugate gradient procedure for a non-Hilbert normed linear space; see Ivie Stein Jr. [I. Stein Jr., Conjugate gradient methods in Banach spaces, Nonlinear Anal. 63 (2005) e2621–e2628], where local convergence theorems are given. The approach using a differential equation of steepest descent is motivated and described by James Eells Jr. in [J. Eells Jr., A setting for global analysis, Bull. Amer. Math. Soc. 72 (1966) 751–807]. In addition, a normalized differential equation of steepest descent is used as a numerical minimization procedure in connection with starting methods such as the higher-order Runge–Kutta methods described by Baylis Shanks in [E. Baylis Shanks, Solutions of differential equations by evaluations of functions, Math. Comput. 20 (1966) 21–38], and higher-order multi-step methods such as Adams–Bashforth.
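To make the steepest-descent approach concrete, the sketch below integrates a normalized steepest-descent equation u'(t) = -G(u)/||G(u)|| with a classical fourth-order Runge–Kutta step, applied to a discretized Dirichlet-type functional with zero boundary values. It is a minimal illustration, not the paper's algorithm: the grid, the sample functional, and the use of the ordinary Euclidean gradient in place of the metric (duality-map) gradient on W^{1,p}_0 are all simplifying assumptions.

```python
import numpy as np

# Minimal sketch (not the paper's algorithm): minimize the discrete
# Dirichlet-type functional  J(u) = h * sum( 0.5*u'^2 - f*u )  with zero
# boundary values, by integrating the *normalized* steepest-descent ODE
#   u'(t) = -G(u) / ||G(u)||
# with repeated classical 4th-order Runge-Kutta steps.  The Euclidean
# gradient G stands in for the metric gradient used in the paper.

n = 49                          # interior grid points on (0, 1)
h = 1.0 / (n + 1)
f = np.ones(n)                  # hypothetical source term

def J(u):
    up = np.diff(np.concatenate(([0.0], u, [0.0]))) / h   # u' with zero BCs
    return h * (0.5 * np.sum(up**2) - np.sum(f * u))

def grad_J(u):
    # gradient of the discrete functional: h * (-u'' - f), zero BCs
    ue = np.concatenate(([0.0], u, [0.0]))
    lap = (ue[:-2] - 2.0 * ue[1:-1] + ue[2:]) / h**2
    return h * (-lap - f)

def rhs(u):
    g = grad_J(u)
    return -g / max(np.linalg.norm(g), 1e-14)   # normalized descent field

def rk4_step(u, dt):
    k1 = rhs(u); k2 = rhs(u + 0.5*dt*k1)
    k3 = rhs(u + 0.5*dt*k2); k4 = rhs(u + dt*k3)
    return u + (dt/6.0) * (k1 + 2*k2 + 2*k3 + k4)

u = np.zeros(n)
for _ in range(2000):
    u = rk4_step(u, dt=0.01)
print("J(u) =", J(u))           # decreases toward the minimum of J
```

The normalization fixes the speed of the descent trajectory at one, which is why a fixed Runge–Kutta step size behaves predictably far from the minimizer; near the minimizer the iterate oscillates within a ball of radius about dt.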


Similar Articles

A new Levenberg-Marquardt approach based on Conjugate gradient structure for solving absolute value equations

In this paper, we present a new approach for solving the absolute value equation (AVE) which uses the Levenberg–Marquardt method with a conjugate subgradient structure. In conjugate subgradient methods the new direction is obtained by combining the steepest descent direction and the previous direction, which may not lead to good numerical results. Therefore, we replace the steepest descent dir...
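The excerpt stops before the paper's conjugate-subgradient update, so the sketch below shows only the plain Levenberg–Marquardt baseline it modifies, applied to the AVE residual F(x) = Ax - |x| - b with the generalized Jacobian A - diag(sign(x)). All names and parameters are illustrative, not the paper's method.

```python
import numpy as np

# Hedged sketch: plain Levenberg-Marquardt on the AVE residual
# F(x) = A x - |x| - b, generalized Jacobian J(x) = A - diag(sign(x)).
# The paper's conjugate-subgradient direction is NOT reproduced here.

def lm_ave(A, b, lam=1e-2, tol=1e-10, max_iter=100):
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        F = A @ x - np.abs(x) - b
        if np.linalg.norm(F) < tol:
            break
        Jac = A - np.diag(np.sign(x))
        # damped normal equations: (J^T J + lam I) d = -J^T F
        d = np.linalg.solve(Jac.T @ Jac + lam * np.eye(n), -Jac.T @ F)
        x_new = x + d
        F_new = A @ x_new - np.abs(x_new) - b
        if np.linalg.norm(F_new) < np.linalg.norm(F):
            x, lam = x_new, lam * 0.5     # accept step, relax damping
        else:
            lam *= 2.0                     # reject step, increase damping
    return x

# The AVE is uniquely solvable when the smallest singular value of A
# exceeds 1; the shift by 3*I below makes that likely for this example.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)) + 3.0 * np.eye(5)
x_true = rng.standard_normal(5)
b = A @ x_true - np.abs(x_true)
x_sol = lm_ave(A, b)
print(np.linalg.norm(A @ x_sol - np.abs(x_sol) - b))   # residual ~ 0
```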


High-order Sobolev preconditioning

This paper compares the use of first- and second-order Sobolev gradients to solve differential equations using the method of least-squares steepest descent. The use of high-order Sobolev gradients offers a very effective preconditioning strategy for the linear part of a nonlinear differential equation.
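For the first-order case, a Sobolev gradient can be sketched in one dimension as follows: the H^1 gradient g_S is obtained from the ordinary L^2 gradient g by solving (I - D^2) g_S = g with zero boundary conditions. This is the standard Neuberger-style construction, assumed here as a stand-in for the paper's higher-order operators.

```python
import numpy as np

# Hedged sketch of first-order Sobolev preconditioning (Neuberger-style),
# not the paper's exact construction: the H^1 gradient g_S solves
# (I - D^2) g_S = g, where g is the ordinary L^2 gradient and D^2 is the
# 1-D Laplacian with zero boundary conditions, discretized on n points.

def sobolev_gradient(g, h):
    n = len(g)
    # tridiagonal matrix for I - D^2 on the interior points
    main = (1.0 + 2.0 / h**2) * np.ones(n)
    off = (-1.0 / h**2) * np.ones(n - 1)
    M = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.solve(M, g)

# Usage inside steepest descent: replace  u -= alpha * g
# with                                    u -= alpha * sobolev_gradient(g, h);
# the smoothing lets alpha stay O(1) instead of shrinking like O(h^2).
```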


Sobolev gradients: a nonlinear equivalent operator theory in preconditioned numerical methods for elliptic PDEs

Solution methods for nonlinear boundary value problems form one of the most important topics in applied mathematics and, as for linear equations, preconditioned iterative methods are the most efficient tools for solving such problems. For linear equations, the theory of equivalent operators in Hilbert space has proved to be an efficient organizing framework for the study of preconditioners [6, 9], ...
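The equivalent-operator idea the excerpt alludes to can be stated, for the self-adjoint case, roughly as follows; this is a hedged paraphrase of the standard theory, not the paper's exact statement.

```latex
% Preconditioned simple iteration with a fixed linear operator S
% ("equivalent operator" preconditioning, self-adjoint sketch):
u_{k+1} = u_k - \tau\, S^{-1} F(u_k), \qquad \tau = \frac{2}{M+m}.
% If S is spectrally equivalent to the derivatives of F, i.e.
m\,\langle S v, v\rangle \;\le\; \langle F'(w)\, v, v\rangle \;\le\; M\,\langle S v, v\rangle
\quad \text{for all admissible } v, w,
% then the iteration converges linearly with the mesh-independent
% ratio (M - m)/(M + m).
```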


A Free Line Search Steepest Descent Method for Solving Unconstrained Optimization Problems

In this paper, we solve unconstrained optimization problems using a free line search steepest descent method. First, we propose a double-parameter scaled quasi-Newton formula for calculating an approximation of the Hessian matrix. The approximation obtained from this formula is a positive definite matrix that satisfies the standard secant relation. We also show that the largest eigenvalue...
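The excerpt stops before the paper's double-parameter formula, so as an illustration of the same idea (a step length derived from the secant relation, with no line search) the sketch below uses the Barzilai–Borwein step. It is a stand-in, not the paper's method.

```python
import numpy as np

# Hedged stand-in: the excerpt's double-parameter scaled quasi-Newton
# formula is not reproduced; the Barzilai-Borwein step below likewise
# derives the step length from the secant pair (s, y), no line search.

def bb_descent(grad, x, step0=1e-3, max_iter=500, tol=1e-8):
    g = grad(x)
    alpha = step0
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g              # secant pair
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else step0  # BB1 step, kept positive
        x, g = x_new, g_new
    return x

# Quadratic test: minimize 0.5 * x^T diag(1..10) x
d = np.arange(1.0, 11.0)
x_star = bb_descent(lambda x: d * x, x=np.ones(10))
print(np.linalg.norm(x_star))   # ~0: converges without any line search
```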


Adjoint-based optimization of PDE systems with alternative gradients

In this work we investigate a technique for accelerating the convergence of adjoint-based optimization of PDE systems based on a nonlinear change of variables in the control space. This change of variables is accomplished in the differentiate-then-discretize approach by constructing the descent directions in a control space not equipped with the Hilbert structure. We show how such descent direction...
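One standard way to build a descent direction in a control space without Hilbert structure, and the construction underlying the metric gradient in the headline paper above, is the l^p duality map. The sketch below is illustrative and assumes a finite-dimensional control vector, not the paper's PDE setting.

```python
import numpy as np

# Hedged sketch: steepest-descent direction with respect to the l^p norm.
# Minimizing <g, d> over ||d||_p = 1 (equality case of Holder's inequality)
# gives
#   d = -sign(g) * |g|^(q-1) / || |g|^(q-1) ||_p,   1/p + 1/q = 1,
# which reduces to -g/||g||_2 when p = 2 and approaches a sign-like
# direction as p -> infinity (the weak-norm regime in the paper above).

def lp_descent_direction(g, p):
    q = p / (p - 1.0)
    d = -np.sign(g) * np.abs(g) ** (q - 1.0)
    return d / np.linalg.norm(d, ord=p)

g = np.array([3.0, -0.1, 0.5])
for p in (2.0, 4.0, 16.0):
    print(p, lp_descent_direction(g, p))
```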




Publication date: 2015